    Quantile Function-based Models for Resource Utilization and Power Consumption of Applications

    Get PDF
    Server consolidation is widely employed to improve the energy efficiency of data centers. While promising, it can lead to resource interference between applications and thus to reduced application performance. Current approaches that account for possible resource interference are not well suited to handle variation in application workloads; as a consequence, they cannot prevent resource interference when workloads vary. The assumption is that models describing the resource utilization and power consumption of applications as functions of their workload can improve consolidation decisions and help prevent resource interference under varying workload. This thesis aims to develop such models for selected applications. As a first step, a workload generator is developed to produce varying workload that resembles the statistical properties of real-world workload. The measurement data for such models usually originates from different sensors and pieces of equipment, each producing data at a different frequency. To account for these different frequencies, the thesis investigates, as a second step, the feasibility of employing quantile functions as model inputs. Since conventional goodness-of-fit tests are not appropriate for this approach, an alternative method for assessing the estimation error is also presented.
    Contents: 1 Introduction; 2 Thesis Overview (2.1 Testbed, 2.2 Contributions and Thesis Structure, 2.3 Scope, Assumptions, and Limitations); 3 Generation of Realistic Workload (3.1 Statistical Properties of Internet Traffic, 3.2 Statistical Properties of Video Server Traffic, 3.3 Implementation of Workload Generation, 3.4 Summary); 4 Models for Resource Utilization and for Power Consumption (4.1 Introduction, 4.2 Prior Work, 4.3 Test Cases, 4.4 Applying Regression to Samples of Different Length, 4.5 Models for Resource Utilization as Function of Request Size, 4.6 Models for Power Consumption as Function of Resource Utilization, 4.7 Summary); 5 Conclusion & Future Work (5.1 Summary, 5.2 Future Work); Appendices
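    A minimal sketch (in Python, with hypothetical data; not the thesis's actual models) of the core idea described above: measurement samples of different lengths and sampling frequencies are reduced to empirical quantile functions evaluated on a common probability grid, and those values serve as regression inputs.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

PROBS = np.linspace(0.05, 0.95, 19)  # common probability grid

def quantile_features(sample):
    """Empirical quantile function of one measurement window, any length."""
    return np.quantile(np.asarray(sample, dtype=float), PROBS)

# Hypothetical data: per measurement window, a request-size sample of
# variable length and the mean CPU utilization observed in that window.
rng = np.random.default_rng(1)
windows = [rng.lognormal(mean=8.0, sigma=1.0, size=rng.integers(50, 500))
           for _ in range(100)]
cpu_util = np.array([0.3 + 1e-5 * w.mean() + rng.normal(0.0, 0.02)
                     for w in windows])

# Each window contributes the same number of features regardless of length.
X = np.vstack([quantile_features(w) for w in windows])
model = LinearRegression().fit(X, cpu_util)
print(model.predict(quantile_features(windows[0]).reshape(1, -1)))
```

    Because every window is summarized by the same fixed number of quantile values, samples recorded at different frequencies or over different durations become directly comparable regression inputs.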

    Drug-coated balloons for small coronary artery disease in patients with chronic kidney disease: a pre-specified analysis of the BASKET-SMALL 2 trial

    Get PDF
    Background: Data on the safety and efficacy of drug-coated balloons (DCB) compared to drug-eluting stents (DES) in patients with chronic kidney disease (CKD) are scarce, particularly at long term. This pre-specified analysis aimed to investigate the 3-year efficacy and safety of DCB versus DES for small coronary artery disease (<3 mm) according to renal function at baseline. Methods: BASKET-SMALL 2 was a large multi-center, randomized, controlled trial that tested the efficacy and safety of DCBs (n=382) against DESs (n=376) in small vessel disease. CKD was defined as eGFR <60 ml/min/1.73 m2. The primary endpoint was the composite of cardiac death, non-fatal myocardial infarction, and target vessel revascularization (MACE) during 3 years. Results: A total of 174/758 (23%) patients had CKD, of whom 91 were randomized to DCB and 83 to DES implantation. The primary efficacy outcome during 3 years was similar between DCB and DES patients (HR 0.98; 95% CI 0.67–1.44; p=0.937) and between patients with and without CKD (HR 1.18; 95% CI 0.76–1.83; p=0.462). Rates of cardiac death and all-cause death were significantly higher among patients with CKD but not affected by treatment with DCB or DES. Major bleeding events were lower in the DCB compared to the DES group (12 vs. 3; HR 0.26; 95% CI 0.07–0.92; p=0.037) and not influenced by the presence of CKD. Conclusions: The long-term efficacy and safety of DCB were similar in patients with and without CKD. The use of DCB was associated with significantly fewer major bleeding events (NCT01574534).

    Myelination generates aberrant ultrastructure that is resolved by microglia

    Get PDF
    To enable rapid propagation of action potentials, axons are ensheathed by myelin, a multilayered insulating membrane formed by oligodendrocytes. Most of the myelin is generated early in development, resulting in long-lasting, stable membrane structures. Here, we explored structural and dynamic changes in central nervous system myelin during development. To achieve this, we performed an ultrastructural analysis of mouse optic nerves by serial block face scanning electron microscopy (SBF-SEM) and confocal time-lapse imaging in the zebrafish spinal cord. We found that myelin undergoes extensive ultrastructural changes during early postnatal development. Myelin degeneration profiles were engulfed and phagocytosed by microglia using exposed phosphatidylserine as one “eat me” signal. In contrast, retractions of entire myelin sheaths occurred independently of microglia and involved uptake of myelin by the oligodendrocyte itself. Our findings show that the generation of myelin early in development is an inaccurate process associated with aberrant ultrastructural features that require substantial refinement.

    Long-Term Results After Drug-Eluting Versus Bare-Metal Stent Implantation in Saphenous Vein Grafts: Randomized Controlled Trial

    Get PDF
    Background: Efficacy data on drug-eluting stents (DES) versus bare-metal stents (BMS) in saphenous vein grafts are controversial. We aimed to compare DES with BMS among patients undergoing saphenous vein graft intervention with regard to long-term outcome. Methods and Results: In this multinational trial, patients were randomized to paclitaxel-eluting stents or BMS. The primary end point was major adverse cardiac events (cardiac death, nonfatal myocardial infarction, and target-vessel revascularization) at 1 year. Secondary end points included major adverse cardiac events and its individual components at 5-year follow-up. One hundred seventy-three patients were included in the trial (89 DES versus 84 BMS). One-year major adverse cardiac event rates were lower with DES compared with BMS (2.2% versus 16.0%; hazard ratio, 0.14; 95% CI, 0.03-0.64; P=0.01), which was mainly driven by a reduction of subsequent myocardial infarctions and need for target-vessel revascularization. Five-year major adverse cardiac event rates remained lower in the DES compared with the BMS arm (35.5% versus 56.1%; hazard ratio, 0.40; 95% CI, 0.23-0.68; P<0.001). A landmark analysis from 1 to 5 years revealed a persistent benefit of DES over BMS (hazard ratio, 0.33; 95% CI, 0.13-0.74; P=0.007) in terms of target-vessel revascularization. More patients in the BMS group underwent multiple target-vessel revascularization procedures throughout the study period compared with the DES group (DES 1.1% [n=1] versus BMS 9.5% [n=8]; P=0.013). Enrollment was stopped before the target sample size of 240 patients was reached. Conclusions: In this randomized controlled trial with prospective long-term follow-up of up to 5 years, DES showed better efficacy than BMS, with sustained benefits over time. DES may be the preferred strategy in this patient population. Registration URL: https://www.clinicaltrials.gov; Unique identifier: NCT00595647.
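    The landmark analysis mentioned above restricts attention to patients who are still event-free at the landmark time and compares treatments only over the remaining follow-up. A minimal sketch with synthetic data (not the trial's analysis code), assuming the lifelines package and hypothetical column names:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic data standing in for the cohort: follow-up time in years,
# an event indicator (e.g., target-vessel revascularization), treatment arm.
rng = np.random.default_rng(0)
n = 173  # illustrative size only
df = pd.DataFrame({
    "years": rng.exponential(scale=4.0, size=n).clip(max=5.0),
    "event": rng.integers(0, 2, size=n),   # 1 = event observed
    "des": rng.integers(0, 2, size=n),     # 1 = DES arm, 0 = BMS arm
})

# Landmark at 1 year: keep only patients still event-free and under
# follow-up at the landmark, then count time from the landmark onward.
landmark = 1.0
lm = df[df["years"] > landmark].copy()
lm["years"] = lm["years"] - landmark

cph = CoxPHFitter()
cph.fit(lm, duration_col="years", event_col="event")
print(cph.hazard_ratios_["des"])  # HR for DES vs BMS beyond the landmark
```

    Restricting the risk set to patients event-free at 1 year is what isolates the persistent benefit beyond the first year reported above.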

    Impact of Insulin-Treated Compared to Non-Insulin-Treated Diabetes Mellitus on Outcome of Percutaneous Coronary Intervention with Drug-Coated Balloons versus Drug-Eluting Stents in De Novo Coronary Artery Disease: The Randomized BASKET-SMALL 2 Trial

    Get PDF
    Background: We evaluated the outcome of PCI of de novo stenosis with drug-coated balloons (DCB) versus drug-eluting stents (DES) in patients with insulin-treated diabetes mellitus (ITDM) versus non-insulin-treated diabetes mellitus (NITDM). Methods: Patients were randomized in the BASKET-SMALL 2 trial to DCB or DES and followed over 3 years for MACE (cardiac death, non-fatal myocardial infarction [MI], and target vessel revascularization [TVR]). Outcome in the diabetic subgroup (n = 252) was analyzed with respect to ITDM or NITDM. Results: In NITDM patients (n = 157), rates of MACE (16.7% vs. 21.9%; hazard ratio [HR] 0.68, 95% confidence interval [CI] 0.29–1.58, p = 0.37) and of death, non-fatal MI, and TVR (8.4% vs. 14.5%; HR 0.30, 95% CI 0.09–1.03, p = 0.057) were similar between DCB and DES. In ITDM patients (n = 95), rates of MACE (DCB 23.4% vs. DES 22.7%; HR 1.12, 95% CI 0.46–2.74, p = 0.81) and of death, non-fatal MI, and TVR (10.1% vs. 15.7%; HR 0.64, 95% CI 0.18–2.27, p = 0.49) were likewise similar between DCB and DES. TVR was significantly lower with DCB versus DES in all diabetic patients (HR 0.41, 95% CI 0.18–0.95, p = 0.038). Conclusions: DCB compared to DES for the treatment of de novo coronary lesions in diabetic patients was associated with similar rates of MACE and a numerically lower need for TVR in both ITDM and NITDM patients.

    Drug-coated balloons for small coronary artery disease (BASKET-SMALL 2): an open-label randomised non-inferiority trial

    Get PDF
    Drug-coated balloons (DCB) are a novel therapeutic strategy for small native coronary artery disease; however, their safety and efficacy are poorly defined in comparison with drug-eluting stents (DES). BASKET-SMALL 2 was a multicentre, open-label, randomised non-inferiority trial. 758 patients with de-novo lesions (<3 mm in diameter) in coronary vessels and an indication for percutaneous coronary intervention were randomly allocated (1:1) to receive angioplasty with DCB versus implantation of a second-generation DES after successful predilatation via an interactive internet-based response system. Dual antiplatelet therapy was given according to current guidelines. The primary objective was to show non-inferiority of DCB versus DES regarding major adverse cardiac events (MACE; ie, cardiac death, non-fatal myocardial infarction, and target-vessel revascularisation) after 12 months. The non-inferiority margin was an absolute difference of 4% in MACE. This trial is registered with ClinicalTrials.gov, number NCT01574534. Between April 10, 2012, and February 1, 2017, 382 patients were randomly assigned to the DCB group and 376 to the DES group. Non-inferiority of DCB versus DES was shown because the 95% CI of the absolute difference in MACE in the per-protocol population was below the predefined margin (-3·83 to 3·93%, p=0·0217). After 12 months, the proportions of MACE were similar in both groups of the full-analysis population (7·5% for the DCB group vs 7·3% for the DES group; hazard ratio [HR] 0·97 [95% CI 0·58-1·64], p=0·9180). There were five (1·3%) cardiac-related deaths in the DES group and 12 (3·1%) in the DCB group (full-analysis population). Probable or definite stent thrombosis (three [0·8%] in the DCB group vs four [1·1%] in the DES group; HR 0·73 [0·16-3·26]) and major bleeding (four [1·1%] in the DCB group vs nine [2·4%] in the DES group; HR 0·45 [0·14-1·46]) were the most common adverse events. In small native coronary artery disease, DCB was non-inferior to DES regarding MACE up to 12 months, with similar event rates for both treatment groups. Funded by the Schweizerischer Nationalfonds zur Förderung der Wissenschaftlichen Forschung, the Basel Cardiovascular Research Foundation, and B Braun Medical AG.
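    The non-inferiority decision above rests on whether the confidence interval for the absolute difference in MACE rates stays below the 4% margin. A minimal sketch with a simple Wald interval and hypothetical event counts (the trial's prespecified method and per-protocol counts may differ):

```python
import math

def noninferiority_wald(events_dcb, n_dcb, events_des, n_des,
                        margin=0.04, z=1.96):
    """95% Wald CI for the MACE-rate difference (DCB minus DES) and the
    non-inferiority verdict: upper CI bound below the +4% margin."""
    p1, p2 = events_dcb / n_dcb, events_des / n_des
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n_dcb + p2 * (1 - p2) / n_des)
    lower, upper = diff - z * se, diff + z * se
    return diff, lower, upper, upper < margin

# Hypothetical counts chosen to roughly match the reported ~7.5% vs ~7.3%
# 12-month MACE proportions; the published analysis used the per-protocol set.
print(noninferiority_wald(events_dcb=28, n_dcb=382, events_des=27, n_des=376))
```

    The verdict hinges on the upper bound of that interval, which is the quantity compared against the prespecified 4% margin.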

    Myelin insulation as a risk factor for axonal degeneration in autoimmune demyelinating disease

    Get PDF
    Axonal degeneration determines the clinical outcome of multiple sclerosis and is thought to result from exposure of denuded axons to immune-mediated damage. Therefore, myelin is widely considered to be a protective structure for axons in multiple sclerosis. Myelinated axons also depend on oligodendrocytes, which provide metabolic and structural support to the axonal compartment. Given that axonal pathology in multiple sclerosis is already visible at early disease stages, before overt demyelination, we reasoned that autoimmune inflammation may disrupt oligodendroglial support mechanisms and hence primarily affect axons insulated by myelin. Here, we studied axonal pathology as a function of myelination in human multiple sclerosis and mouse models of autoimmune encephalomyelitis with genetically altered myelination. We demonstrate that myelin ensheathment itself becomes detrimental for axonal survival and increases the risk of axons degenerating in an autoimmune environment. This challenges the view of myelin as a solely protective structure and suggests that axonal dependence on oligodendroglial support can become fatal when myelin is under inflammatory attack.

    The Effect of High-Variability Training on the Perception and Production of French Stops by German Native Speakers

    Get PDF
    We investigated the effect of high-variability training (HVT) on the production and perception of French bilabial voiced and voiceless stops by German native speakers. Stop consonants in the two languages differ with respect to several articulatory and acoustic features. German learners of French (Experiment Group) trained their perception of word-initial bilabial stops spoken by six French native speakers using identification tests, whereas subjects of a Control Group did not receive training. Additional perception and production tests of French words including bilabial, alveolar, and velar stops in all word positions were performed to capture the impact of HVT. Subjects were found to be quite good at distinguishing voiced and voiceless stops. However, voiceless stops received lower correctness scores than voiced ones, and subjects of the Experiment Group were able to further increase their scores after training. Results for production were mirror-inverted, showing that subjects of the Experiment Group successfully produced longer negative VOT values but did not show an improvement for voiceless stops.